Spiking Boltzmann Machines
Abstract
We first show how to represent sharp posterior probability distributions using real-valued coefficients on broadly-tuned basis functions. Then we show how the precise times of spikes can be used to convey the real-valued coefficients on the basis functions quickly and accurately. Finally we describe a simple simulation in which spiking neurons learn to model an image sequence by fitting a dynamic generative model.

1 Population codes and energy landscapes

A perceived object is represented in the brain by the activities of many neurons, but there is no general consensus on how the activities of individual neurons combine to represent the multiple properties of an object. We start by focussing on the case of a single object that has multiple instantiation parameters such as position, velocity, size and orientation. We assume that each neuron has an ideal stimulus in the space of instantiation parameters and that its activation rate or probability of activation falls off monotonically in all directions as the actual stimulus departs from this ideal. The semantic problem is to define exactly what instantiation parameters are being represented when the activities of many such neurons are specified. Hinton, Rumelhart and McClelland (1986) consider binary neurons with receptive fields that are convex in instantiation space. They assume that when an object is present it activates all of the neurons in whose receptive fields its instantiation parameters lie. Consequently, if it is known that only one object is present, the parameter values of the object must lie within the feasible region formed by the intersection of the receptive fields of the active neurons. This will be called a conjunctive distributed representation. Assuming that each receptive field occupies only a small fraction of the whole space, an interesting property of this type of "coarse coding" is that the bigger the receptive fields, the more accurate the representation.
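The conjunctive distributed representation described above can be illustrated with a small sketch: binary neurons with interval receptive fields tile a 1-D parameter space, a stimulus activates every neuron whose field contains it, and the decoded feasible region is the intersection of the active fields. The tiling, field width, and stimulus value below are arbitrary illustrative choices, not quantities from the paper:

```python
import numpy as np

def feasible_region(stimulus, centers, width):
    """Decode the feasible region: the intersection of the receptive
    fields [c - width/2, c + width/2] of all active neurons."""
    active = np.abs(centers - stimulus) <= width / 2
    lo = np.max(centers[active] - width / 2)
    hi = np.min(centers[active] + width / 2)
    return max(lo, 0.0), min(hi, 1.0)

centers = np.linspace(0, 1, 50)          # 50 neurons tiling [0, 1]
lo, hi = feasible_region(0.37, centers, width=0.1)
print(f"stimulus 0.37 lies in ({lo:.3f}, {hi:.3f})")
```

In one dimension the width of the intersection is set mainly by the spacing of the field centres, so this sketch only shows the decoding; the accuracy advantage of larger fields that the text mentions appears in two or more dimensions, where more field boundaries pass near any given point.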
However, large receptive fields lead to a loss of resolution when several objects are present simultaneously. When the sensory input is noisy, it is impossible to infer the exact parameters of objects, so it makes sense for a perceptual system to represent the probability distribution across parameters rather than just a single best estimate or a feasible region. The full probability distribution is essential for correctly combining infor-
Similar Papers
Implementation of a Restricted Boltzmann Machine in a Spiking Neural Network
Restricted Boltzmann Machines (RBMs) have been demonstrated to perform efficiently on a variety of applications, such as dimensionality reduction and classification. Implementing RBMs on neuromorphic hardware has certain advantages, particularly from a concurrency and low-power perspective. This paper outlines some of the requirements involved for neuromorphic adaptation of an RBM and attempts t...
Unsupervised Learning in Synaptic Sampling Machines
Recent studies have shown that synaptic unreliability is a robust and sufficient mechanism for inducing the stochasticity observed in cortex. Here, we introduce the Synaptic Sampling Machine (SSM), a stochastic neural network model that uses synaptic unreliability as a means to stochasticity for sampling. Synaptic unreliability plays the dual role of an efficient mechanism for sampling in neuro...
Event-driven contrastive divergence for spiking neuromorphic systems
Restricted Boltzmann Machines (RBMs) and Deep Belief Networks have been demonstrated to perform efficiently in a variety of applications, such as dimensionality reduction, feature learning, and classification. Their implementation on neuromorphic hardware platforms emulating large-scale networks of spiking neurons can have significant advantages from the perspectives of scalability, power dissi...
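The contrastive divergence rule that the event-driven scheme above adapts can be sketched in its conventional, clock-driven form. This is a minimal CD-1 update for a binary RBM; the layer sizes, learning rate, and toy data are illustrative assumptions, not details from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_step(v0, W, b, c, lr=0.1):
    """One contrastive-divergence (CD-1) update for a binary RBM.
    v0: batch of visible vectors; W: weights; b, c: visible/hidden biases."""
    # Positive phase: hidden probabilities and samples given the data.
    ph0 = sigmoid(v0 @ W + c)
    h0 = (rng.random(ph0.shape) < ph0).astype(float)
    # One Gibbs step: reconstruct visibles, then recompute hiddens.
    pv1 = sigmoid(h0 @ W.T + b)
    v1 = (rng.random(pv1.shape) < pv1).astype(float)
    ph1 = sigmoid(v1 @ W + c)
    # Update from the difference of data and reconstruction correlations.
    n = v0.shape[0]
    W += lr * (v0.T @ ph0 - v1.T @ ph1) / n
    b += lr * (v0 - v1).mean(axis=0)
    c += lr * (ph0 - ph1).mean(axis=0)
    return W, b, c

# Toy model: 4 visible units, 3 hidden units, 8 random binary patterns.
W = 0.01 * rng.standard_normal((4, 3))
b = np.zeros(4)
c = np.zeros(3)
data = rng.integers(0, 2, size=(8, 4)).astype(float)
for _ in range(100):
    W, b, c = cd1_step(data, W, b, c)
```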
Modeling Laminar Recordings from Visual Cortex with Semi-Restricted Boltzmann Machines
The proliferation of high density recording techniques presents us with new challenges for characterizing the statistics of neural activity over populations of many neurons. The Ising model, which is the maximum entropy model for pairwise correlations, has been used to model the instantaneous state of a population of neurons. This model suffers from two major limitations: 1) Estimation for larg...
Notes on Boltzmann Machines
I. INTRODUCTION Boltzmann machines are probability distributions on high dimensional binary vectors which are analogous to Gaussian Markov Random Fields in that they are fully determined by first and second order moments. A key difference however is that augmenting Boltzmann machines with hidden variables enlarges the class of distributions that can be modeled, so that in principle it is possib...
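The characterization above, a distribution over binary vectors fixed by first- and second-order statistics, can be made concrete for a fully visible Boltzmann machine small enough to enumerate. The weights and biases below are arbitrary illustrative values:

```python
import itertools
import numpy as np

# Energy of a fully visible Boltzmann machine with symmetric weights W
# (zero diagonal) and biases b: E(s) = -1/2 s^T W s - b^T s,
# and probabilities p(s) proportional to exp(-E(s)).
W = np.array([[0.0,  0.8, -0.3],
              [0.8,  0.0,  0.5],
              [-0.3, 0.5,  0.0]])
b = np.array([0.1, -0.2, 0.0])

# Enumerate all 2^3 binary states and normalize by the partition function.
states = np.array(list(itertools.product([0, 1], repeat=3)), dtype=float)
energies = -0.5 * np.einsum('si,ij,sj->s', states, W, states) - states @ b
p = np.exp(-energies)
p /= p.sum()
print(p)  # exact probabilities of all 8 states
```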
Learning and Evaluating Boltzmann Machines
We provide a brief overview of the variational framework for obtaining deterministic approximations or upper bounds for the log-partition function. We also review some of the Monte Carlo based methods for estimating partition functions of arbitrary Markov Random Fields. We then develop an annealed importance sampling (AIS) procedure for estimating partition functions of restricted Boltzmann mac...
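Estimators such as AIS are needed because the partition function of a large RBM is intractable; for a tiny RBM, however, the hidden layer can be summed out analytically, which gives an exact ground truth against which any estimator can be checked. A hedged sketch, with arbitrary layer sizes and parameters:

```python
import itertools
import numpy as np

# RBM energy: E(v, h) = -v^T W h - b^T v - c^T h. Summing out the
# hiddens gives Z = sum_v exp(b^T v) * prod_j (1 + exp(c_j + (v^T W)_j)).
rng = np.random.default_rng(1)
nv, nh = 5, 4
W = 0.1 * rng.standard_normal((nv, nh))
b = 0.1 * rng.standard_normal(nv)
c = 0.1 * rng.standard_normal(nh)

# Analytic sum over hiddens, enumerating only the 2^nv visible states.
Z_marg = 0.0
for v in itertools.product([0, 1], repeat=nv):
    v = np.array(v, dtype=float)
    Z_marg += np.exp(b @ v) * np.prod(1.0 + np.exp(c + v @ W))

# Brute-force check: enumerate both layers (2^(nv+nh) joint states).
Z_full = 0.0
for v in itertools.product([0, 1], repeat=nv):
    for h in itertools.product([0, 1], repeat=nh):
        v_, h_ = np.array(v, dtype=float), np.array(h, dtype=float)
        Z_full += np.exp(v_ @ W @ h_ + b @ v_ + c @ h_)

print(np.isclose(Z_marg, Z_full))  # prints True: the two computations agree
```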
Publication date: 1999